Doubly feature-weighted fuzzy support vector machine
Yunzhi QIU, Tinghua WANG, Xiaolu DAI
Journal of Computer Applications, 2022, 42(3): 683-687. DOI: 10.11772/j.issn.1001-9081.2021040760
Abstract

To address the shortcoming that existing feature-weighted Fuzzy Support Vector Machine (FSVM) methods only consider the influence of feature weights on the membership function while ignoring feature weights in the kernel function computation during training, a new FSVM algorithm that accounts for the influence of feature weights on both the membership function and the kernel function was proposed, namely Doubly Feature-Weighted FSVM (DFW-FSVM). Firstly, the relative weight of each feature was calculated using Information Gain (IG). Secondly, the weighted Euclidean distance between each sample and its class center was computed in the original space based on these feature weights, and the membership function was constructed from this weighted distance; at the same time, the feature weights were applied to the kernel function computation during training. Finally, the DFW-FSVM algorithm was constructed from the weighted membership function and the weighted kernel function. In this way, DFW-FSVM avoids being dominated by weakly relevant or irrelevant features. Comparative experiments were carried out on eight UCI datasets. The results show that, compared with the best results of SVM, FSVM, Feature-Weighted SVM (FWSVM), Feature-Weighted FSVM (FWFSVM) and FSVM based on Centered Kernel Alignment (CKA-FSVM), the accuracy and F1 value of the DFW-FSVM algorithm increase by 2.33 and 5.07 percentage points, respectively, indicating that the proposed DFW-FSVM has good classification performance.
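The sketch below illustrates, under stated assumptions, the pipeline described in the abstract: IG-based feature weights, a membership function built from the weighted distance to the class center, and a feature-weighted kernel used for training. It is not the authors' implementation: information gain is approximated here with scikit-learn's mutual_info_classif, the membership follows the classic distance-to-class-center form, the kernel is assumed to be a weighted RBF, and the memberships are passed as per-sample weights (which scale the slack penalty C per sample, matching the usual FSVM formulation). All function names are illustrative.

import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.svm import SVC

def feature_weights_ig(X, y):
    """Relative feature weights from an information-gain-style score (assumption)."""
    ig = mutual_info_classif(X, y)
    return ig / (ig.sum() + 1e-12)  # normalize so the weights sum to 1

def fuzzy_memberships(X, y, w, delta=1e-3):
    """Membership of each sample: 1 minus its weighted Euclidean distance to the
    class center, divided by the largest such distance in the class (plus delta)."""
    s = np.empty(len(y))
    for c in np.unique(y):
        idx = (y == c)
        Xc = X[idx]
        center = Xc.mean(axis=0)
        d = np.sqrt((((Xc - center) ** 2) * w).sum(axis=1))  # weighted Euclidean distance
        s[idx] = 1.0 - d / (d.max() + delta)
    return s

def weighted_rbf_kernel(A, B, w, gamma=1.0):
    """RBF kernel computed on feature-weighted squared distances (assumed kernel form)."""
    diff = A[:, None, :] - B[None, :, :]
    d2 = ((diff ** 2) * w).sum(axis=-1)
    return np.exp(-gamma * d2)

def train_dfw_fsvm(X, y, C=1.0, gamma=1.0):
    """Doubly feature-weighted FSVM sketch: weights enter both the membership
    function and the kernel; memberships weight each sample's penalty."""
    w = feature_weights_ig(X, y)
    s = fuzzy_memberships(X, y, w)
    K = weighted_rbf_kernel(X, X, w, gamma)
    clf = SVC(C=C, kernel="precomputed")
    clf.fit(K, y, sample_weight=s)  # memberships act as per-sample slack weights
    return clf, w

# Usage: K_test = weighted_rbf_kernel(X_test, X_train, w, gamma); clf.predict(K_test)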
